- Two former TikTok content moderators are suing the company over a lack of support as they flagged toxic content.
- Both worked through TikTok's proprietary moderation software, which monitored how many videos they watched.
- The suit adds to growing complaints that tech giants are not protecting moderators from trauma.
Two former TikTok content moderators are suing the video-sharing app and its Beijing-based parent company, ByteDance, alleging the companies imposed tough quotas and failed to provide adequate mental health support as the moderators weeded out "highly toxic and extremely disturbing images."
Ashley Velez and Reece Young allege that TikTok has broken California labor laws by exposing them to serious harm and trauma, according to a lawsuit filed by the Joseph Saveri Law Firm on March 24. They are seeking class-action status for the suit. This latest lawsuit against TikTok adds to growing calls from content moderators who say that social-media giants are not doing enough to protect them against the "worst of humanity."
Both Velez and Young say they worked as content moderators for TikTok and were instructed to review and remove graphic content from the platform.
Velez and Young allege in the lawsuit that the work quotas imposed on them were "oppressive."
TikTok's moderators review content through the company's proprietary software, and their performance is monitored by that platform, per the claim. According to the suit, review quotas were stringent, with moderators expected to work 12 hours a day with only one hour for lunch and two 15-minute breaks.
The lawsuit alleges moderators were given only 25 seconds to complete each video assessment and required to be accurate at least 80% of the time.
In addition to the grueling pace of the work, Velez and Young both claim in the suit that the job repeatedly exposed them to "highly toxic and extremely disturbing images at the workplace." According to the complaint, Velez and Young "suffered immense stress and psychological harm" after viewing images that included violence against young children, bestiality, and necrophilia.
Moreover, the moderators were expected to "keep inside the horrific things they see while reviewing content" because they were bound by non-disclosure agreements signed before starting work, the suit said.
"Somebody has to suffer and see this stuff, so nobody else has to," Velez told NPR.
But according to the suit, ByteDance and TikTok failed to provide adequate mental health treatment and support, and both Velez and Young were forced to pursue mental health care on their own time and at their own expense.
In a statement to NPR, TikTok said it "strives to promote a caring working environment for our employees and contractors." Its moderators "are offered a range of wellness services so that they feel supported emotionally and mentally," the company said.
From July to September 2021, TikTok removed some 91.4 million videos for violating its guidelines, about 1% of all videos uploaded, according to the company. Videos were removed for violating a variety of policies, including those covering violent extremism, hateful behavior, and harassment and bullying, the company said.
This is not the first time a social-media company has been sued by its moderators. In 2020, the same law firm now representing Velez and Young, the Joseph Saveri Law Firm, won a case against Facebook that required the platform to pay $52 million to content moderators suffering from post-traumatic stress disorder, according to NPR.
And in December, Candie Frazier, another TikTok content moderator, filed a complaint alleging she suffered from PTSD as a result of her job. According to the suit, which was seen by Insider, Frazier said she suffered from "horrific nightmares" and often replayed "videos that she has seen in her mind."